Robust Combining of Disparate Classifiers through Order Statistics
Authors
Abstract
Integrating the outputs of multiple classifiers via combiners or meta-learners has led to substantial improvements in several difficult pattern recognition problems. In this article we investigate a family of combiners based on order statistics, for robust handling of situations where there are large discrepancies in the performance of individual classifiers. Based on a mathematical modeling of how the decision boundaries are affected by order statistic combiners, we derive expressions for the reductions in error expected when simple output combination methods based on the median, the maximum and, in general, the i-th order statistic are used. Furthermore, we analyze the trim and spread combiners, both based on linear combinations of the ordered classifier outputs, and show that in the presence of uneven classifier performance, they often provide substantial gains over both linear and simple order statistics combiners. Experimental results on both real-world data and standard public-domain data sets corroborate these findings.
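The simple order statistic combiners described in the abstract (median, maximum, i-th order statistic) and a trimmed-mean variant can be sketched as follows. This is a minimal illustration under the assumption that each classifier emits class-posterior estimates; the function and parameter names are ours, and the paper's exact trim/spread coefficient choices may differ:

```python
import numpy as np

def order_statistic_combiner(outputs, kind="median", i=None, trim=0):
    """Combine per-classifier posterior estimates via order statistics.

    outputs: array of shape (n_classifiers, n_classes), one row per classifier.
    kind:    "median", "max", "ith" (the i-th order statistic, 0-indexed),
             or "trim" (drop the `trim` smallest and largest outputs per
             class, then average the rest -- a trimmed mean).
    Returns the index of the predicted class.
    """
    s = np.sort(outputs, axis=0)            # order statistics per class, ascending
    n = s.shape[0]
    if kind == "median":
        combined = np.median(outputs, axis=0)
    elif kind == "max":
        combined = s[-1]                    # largest output per class
    elif kind == "ith":
        combined = s[i]                     # i-th order statistic per class
    elif kind == "trim":
        combined = s[trim:n - trim].mean(axis=0)
    else:
        raise ValueError(f"unknown combiner kind: {kind}")
    return int(combined.argmax())
```

Note how the trimmed combiner discards extreme outputs per class before averaging, which is one way a combiner can be robust to a badly performing individual classifier.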
Similar papers
Combining Nearest Neighbor Classifiers Through Multiple Feature Subsets
Combining multiple classifiers is an effective technique for improving accuracy. There are many general combining algorithms, such as Bagging or Error Correcting Output Coding, that significantly improve classifiers like decision trees, rule learners, or neural networks. Unfortunately, many combining methods do not improve the nearest neighbor classifier. In this paper, we present MFS, a combining a...
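The MFS idea summarized in the snippet above, voting over nearest neighbor classifiers each restricted to a random feature subset, can be sketched as follows. This is a hedged minimal version using 1-NN and majority vote; the function name, defaults, and distance choice are illustrative assumptions, not the paper's exact algorithm:

```python
import random
from collections import Counter

def mfs_predict(train_X, train_y, x, n_subsets=10, subset_size=3, seed=0):
    """Predict the class of x by majority vote over n_subsets 1-NN
    classifiers, each using a random subset of the features (MFS-style)."""
    rng = random.Random(seed)
    n_features = len(train_X[0])
    votes = []
    for _ in range(n_subsets):
        feats = rng.sample(range(n_features), min(subset_size, n_features))
        # 1-NN within this feature subset (squared Euclidean distance)
        def dist(a):
            return sum((a[f] - x[f]) ** 2 for f in feats)
        nearest = min(range(len(train_X)), key=lambda i: dist(train_X[i]))
        votes.append(train_y[nearest])
    # majority vote across the feature-subset classifiers
    return Counter(votes).most_common(1)[0][0]
```

Because each component classifier sees different features, their errors tend to be less correlated than those of a single full-feature nearest neighbor classifier, which is the property combining exploits.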
Error Correlation and Error Reduction in Ensemble Classifiers
Using an ensemble of classifiers, instead of a single classifier, can lead to improved generalization. The gains obtained by combining, however, are often affected more by the selection of what is presented to the combiner than by the actual combining method that is chosen. In this paper we focus on data selection and classifier training methods, in order to "prepare" classifiers for combining. We r...
Nearest Neighbor Classification from Multiple Feature Subsets
Combining multiple classifiers is an effective technique for improving accuracy. There are many general combining algorithms, such as Bagging, Boosting, or Error Correcting Output Coding, that significantly improve classifiers like decision trees, rule learners, or neural networks. Unfortunately, these combining methods do not improve the nearest neighbor classifier. In this paper, we present MFS, a...
Double-bagging: combining classifiers by bootstrap aggregation
The combination of classifiers leads to substantial reduction of misclassification error in a wide range of applications and benchmark problems. We suggest using an out-of-bag sample for combining different classifiers. In our setup, a linear discriminant analysis is performed using the observations in the out-of-bag sample, and the corresponding discriminant variables computed for the observations...
Adaptive Selection of Image Classifiers
Recently, the concept of "Multiple Classifier Systems" was proposed as a new approach to the development of high-performance image classification systems. Multiple Classifier Systems can be used to improve classification accuracy by combining the outputs of classifiers making "uncorrelated" errors. Unfortunately, in real image recognition problems, it may be very difficult to design an ensemble of cla...
Publication date: 2001